
Conversation

@JaredforReal (Collaborator) commented Oct 16, 2025

What this PR does / why we need it:

  • Fix the Python pre-commit conflict between isort and black (a sketch of the usual fix follows below)
  • Add MiniLM-L12-v2 to make download-models and use models/all-MiniLM-L12-v2 in config.yaml
  • Update docker-compose-down and add the docker-compose-down-core, docker-compose-down-testing, and docker-compose-down-llm-katan targets
  • Fix CI test and build errors

Fixes #438 and #448
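
For context, the usual way to stop isort and black from disagreeing over import formatting is to run isort with its black-compatible profile. Below is a minimal .pre-commit-config.yaml sketch of that approach; the hook revisions are placeholders, and this is not necessarily the exact change made in this PR.

repos:
  - repo: https://github.com/pycqa/isort
    rev: 5.13.2          # placeholder rev
    hooks:
      - id: isort
        args: ["--profile", "black"]   # align isort's import ordering style with black
  - repo: https://github.com/psf/black
    rev: 24.8.0          # placeholder rev
    hooks:
      - id: black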


netlify bot commented Oct 16, 2025

Deploy Preview for vllm-semantic-router ready!

🔨 Latest commit: 952d474
🔍 Latest deploy log: https://app.netlify.com/projects/vllm-semantic-router/deploys/68f099c3825c8a0008049475
😎 Deploy Preview: https://deploy-preview-450--vllm-semantic-router.netlify.app


github-actions bot commented Oct 16, 2025

👥 vLLM Semantic Team Notification

The following members have been identified for the changed files in this PR and have been automatically assigned:

📁 Root Directory

Owners: @rootfs, @Xunzhuo
Files changed:

  • .github/workflows/docker-publish.yml
  • .github/workflows/test-and-build.yml
  • .pre-commit-config.yaml

📁 config

Owners: @rootfs
Files changed:

  • config/config-mcp-classifier-example.yaml
  • config/config.development.yaml
  • config/config.e2e.yaml
  • config/config.production.yaml
  • config/config.recipe-accuracy.yaml
  • config/config.recipe-latency.yaml
  • config/config.recipe-token-efficiency.yaml
  • config/config.testing.yaml
  • config/config.yaml

📁 e2e-tests

Owners: @yossiovadia
Files changed:

  • e2e-tests/llm-katan/Dockerfile

📁 src

Owners: @rootfs, @Xunzhuo, @wangchen615
Files changed:

  • src/training/training_lora/classifier_model_fine_tuning_lora/ft_linear_lora.py
  • src/training/training_lora/classifier_model_fine_tuning_lora/ft_qwen3_generative_lora.py
  • src/training/training_lora/pii_model_fine_tuning_lora/pii_bert_finetuning_lora.py
  • src/training/training_lora/prompt_guard_fine_tuning_lora/jailbreak_bert_finetuning_lora.py

📁 tools

Owners: @yuluo-yx, @rootfs, @Xunzhuo
Files changed:

  • tools/make/docker.mk
  • tools/make/models.mk


🎉 Thanks for your contributions!

This comment was automatically generated based on the OWNER files in the repository.

@JaredforReal (Collaborator, Author) commented:

@yuluo-yx check this out

Signed-off-by: JaredforReal <[email protected]>
Signed-off-by: JaredforReal <[email protected]>
@JaredforReal changed the title from "fix: python pre-commit conflict & add MiniLM-L12-v2 & docker-compose-down" to "fix: CI error & pre-commit & add MiniLM-L12-v2 & docker-compose-down" on Oct 16, 2025
A contributor commented on the "Free up disk space" workflow step (context snippet below):

fail-fast: false

steps:
- name: Free up disk space

Before cleanup:
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        72G   50G   23G  70% /
tmpfs           7.9G   84K  7.9G   1% /dev/shm
tmpfs           3.2G  1.1M  3.2G   1% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sdb16      881M   60M  760M   8% /boot
/dev/sdb15      105M  6.2M   99M   6% /boot/efi
/dev/sda1        74G  4.1G   66G   6% /mnt
tmpfs           1.6G   12K  1.6G   1% /run/user/1001
Total reclaimed space: 0B

After cleanup:
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        72G   35G   37G  49% /
tmpfs           7.9G   84K  7.9G   1% /dev/shm
tmpfs           3.2G  1.1M  3.2G   1% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sdb16      881M   60M  760M   8% /boot
/dev/sdb15      105M  6.2M   99M   6% /boot/efi
/dev/sda1        74G  4.1G   66G   6% /mnt
tmpfs           1.6G   12K  1.6G   1% /run/user/1001

Looks like 15G was cleared.
Does this help with docker builds? 👀
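
For reference, a disk-cleanup step on GitHub-hosted Ubuntu runners usually deletes large preinstalled toolchains and prunes the Docker cache, which would account for a drop of roughly this size. A sketch of such a step is below; the paths are typical for ubuntu-latest images, and the exact commands used in this workflow may differ.

- name: Free up disk space
  run: |
    # Remove large preinstalled toolchains (typical locations on ubuntu-latest runners)
    sudo rm -rf /usr/share/dotnet /usr/local/lib/android /opt/ghc
    # Drop unused Docker images and build cache; prints "Total reclaimed space: ..."
    docker system prune -af
    # Show the result, as in the output above
    df -h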

The contributor followed up:

I didn't see #448, please ignore this comment

@yuluo-yx (Contributor) left a comment:

LGTM

@Xunzhuo merged commit 75cd042 into vllm-project:main on Oct 16, 2025 (16 checks passed).
@JaredforReal deleted the fixes branch on October 17, 2025.

Labels: none yet
Projects: none yet
Linked issue this PR may close: Deploy: docker compose start failed
6 participants